Search results for Creators/Authors contains: "Young, Lai-Sang"


  1. This paper is about a class of stochastic reaction networks. Of interest are the dynamics of interconversion among a finite number of substances through reactions that consume some of the substances and produce others. The models we consider are continuous-time Markov jump processes, intended as idealizations of a broad class of biological networks. Reaction rates depend linearly on “enzymes,” which are among the substances produced, and a reaction can occur only in the presence of sufficient upstream material. We present rigorous results for this class of stochastic dynamical systems, the mean-field behaviors of which are described by ordinary differential equations (ODEs). Under the assumption of exponential network growth, we identify certain ODE solutions as being potentially traceable and give conditions under which network trajectories, suitably rescaled, are with high probability well approximated by these ODE solutions. This leads to a complete characterization of the ω-limit sets of such network solutions (as points or random tori). Dimension reduction, depending on the number of enzymes, is also noted. The second half of this paper focuses on depletion dynamics, i.e., dynamics subsequent to the “phase transition” that occurs when one of the substances becomes unavailable. The picture can be complex, for the depleted substance can be produced intermittently through other network reactions. Treating the model as a slow–fast system, we offer a mean-field description, a first step to understanding what we believe is one of the most natural bifurcations for reaction networks.
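    The jump-process-versus-mean-field picture in this abstract can be illustrated with a standard Gillespie simulation. The sketch below is a minimal toy example, not the paper's model: a three-species network (A, B, and an enzyme E) with hypothetical stoichiometry and rate constants, simulated as a continuous-time Markov jump process and, for comparison, integrated as its mean-field ODE.

        import numpy as np

        rng = np.random.default_rng(0)

        # State x = (A, B, E).  Reaction 1 converts A -> B at a rate linear in
        # the enzyme count E (and proportional to the available A, so it shuts
        # off when A is depleted); reaction 2 produces enzyme E from B.
        stoich = np.array([[-1, +1, 0],       # A -> B   (enzyme-catalysed)
                           [0, -1, +1]])      # B -> E
        k = np.array([0.02, 0.01])            # hypothetical rate constants

        def propensities(x):
            A, B, E = x
            return np.array([k[0] * E * A, k[1] * B])

        def gillespie(x0, t_end):
            """One realization of the continuous-time Markov jump process."""
            t, x = 0.0, np.array(x0, dtype=float)
            while t < t_end:
                a = propensities(x)
                total = a.sum()
                if total == 0.0:
                    break                     # every reaction blocked: depletion
                t += rng.exponential(1.0 / total)
                x = x + stoich[rng.choice(len(a), p=a / total)]
            return x

        def mean_field(x0, t_end, dt=1e-3):
            """Forward-Euler integration of the mean-field ODE dx/dt = S^T a(x)."""
            x = np.array(x0, dtype=float)
            for _ in range(int(t_end / dt)):
                x = x + dt * stoich.T @ propensities(x)
            return x

        x0 = [500, 10, 5]
        print("jump process at t=50:  ", gillespie(x0, 50.0))
        print("mean-field ODE at t=50:", mean_field(x0, 50.0))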
  2. The brain produces rhythms in a variety of frequency bands. Some are likely by-products of neuronal processes; others are thought to be top-down. Produced entirely naturally, these rhythms have clearly recognizable beats, but they are very far from periodic in the sense of mathematics. The signals are broad-band, episodic, wandering in amplitude and frequency; the rhythm comes and goes, degrading and regenerating. Gamma rhythms, in particular, have been studied by many authors in computational neuroscience, using reduced models as well as networks of hundreds to thousands of integrate-and-fire neurons. All of these models successfully captured the oscillatory nature of gamma rhythms, but the irregular character of gamma in reduced models has not been investigated thoroughly. In this article, we tackle the mathematical question of whether signals with the properties of brain rhythms can be generated from low-dimensional dynamical systems. We found that while adding white noise to single periodic cycles can simulate gamma dynamics to some degree, such models tend to be limited in their ability to capture the range of behaviors observed. Using an ODE with two variables inspired by the FitzHugh-Nagumo and Leslie-Gower models, with stochastically varying coefficients designed to control independently amplitude, frequency, and degree of degeneracy, we were able to replicate the qualitative characteristics of natural brain rhythms. To demonstrate model versatility, we simulate the power spectral densities of gamma rhythms in various brain states recorded in experiments.
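    A minimal sketch of the modeling idea in this abstract, not the authors' actual equations: a FitzHugh-Nagumo-type two-variable ODE whose drive parameter wanders as an Ornstein-Uhlenbeck process, so the oscillation drifts in amplitude and frequency and occasionally degenerates; a plain periodogram then gives the power spectral density. All coefficients and the time rescaling below are illustrative guesses.

        import numpy as np

        rng = np.random.default_rng(1)
        dt, n = 5e-5, 400_000                    # 20 s sampled at 20 kHz

        v, w, a = 0.0, 0.0, 1.0
        trace = np.empty(n)
        for i in range(n):
            # Ornstein-Uhlenbeck drift of the drive parameter `a`: slow wandering
            # in and out of the oscillatory regime makes the rhythm episodic.
            a += dt * (1.0 - a) + 0.5 * np.sqrt(dt) * rng.standard_normal()
            # FitzHugh-Nagumo-like fast/slow pair; the factor 1500 is an assumed
            # time rescaling that puts the cycle in the tens-of-Hz range.
            v += dt * 1500.0 * (v - v ** 3 / 3.0 - w + a)
            w += dt * 1500.0 * 0.08 * (v + 0.7 - 0.8 * w)
            trace[i] = v

        # Power spectral density via a plain periodogram
        freqs = np.fft.rfftfreq(n, dt)
        psd = np.abs(np.fft.rfft(trace - trace.mean())) ** 2 / n
        band = (freqs > 5.0) & (freqs < 200.0)
        print(f"dominant frequency ~ {freqs[band][np.argmax(psd[band])]:.1f} Hz")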
  3. Abstract

    In this paper we present a rigorous analysis of a class of coupled dynamical systems in which two distinct types of components, one excitatory and the other inhibitory, interact with one another. These network models are finite in size but can be arbitrarily large. They are inspired by real biological networks, and possess features that are idealizations of those in biological systems. Individual components of the network are represented by simple, much-studied dynamical systems. Complex dynamical patterns on the network level emerge as a result of the coupling among its constituent subsystems. Appealing to existing techniques in (nonuniform) hyperbolic theory, we study their Lyapunov exponents and entropy, and prove that large-time network dynamics are governed by physical measures with the SRB property.

     
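    For a numerical feel for the quantities discussed here, the sketch below estimates the largest Lyapunov exponent of a toy network of coupled circle maps with excitatory and inhibitory nodes, using the standard Benettin two-trajectory method. The map, coupling strengths, and network size are arbitrary choices for illustration, not the systems analyzed in the paper.

        import numpy as np

        rng = np.random.default_rng(2)
        n_exc, n_inh = 8, 2
        n = n_exc + n_inh
        signs = np.concatenate([np.ones(n_exc), -np.ones(n_inh)])   # E nodes +, I nodes -
        W = 0.3 * rng.random((n, n)) * signs                         # signed coupling (columns = presynaptic)

        def step(theta):
            # Expanding circle map on every node, nudged by the signed network input
            drive = W @ np.sin(2.0 * np.pi * theta)
            return (2.0 * theta + 0.05 * drive) % 1.0

        def largest_lyapunov(n_steps=20_000, delta0=1e-8):
            x = rng.random(n)
            d_vec = rng.standard_normal(n)
            d_vec *= delta0 / np.linalg.norm(d_vec)
            y = (x + d_vec) % 1.0
            total = 0.0
            for _ in range(n_steps):
                x, y = step(x), step(y)
                diff = ((y - x + 0.5) % 1.0) - 0.5      # shortest displacement on the torus
                d = np.linalg.norm(diff)
                total += np.log(d / delta0)
                y = (x + (delta0 / d) * diff) % 1.0     # renormalize the separation
            return total / n_steps

        print(f"estimated top Lyapunov exponent: {largest_lyapunov():.3f}"
              "  (log 2 = 0.693 for an uncoupled node)")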
  4. Rubin, Jonathan (Ed.)
    Constraining the many biological parameters that govern cortical dynamics is computationally and conceptually difficult because of the curse of dimensionality. This paper addresses these challenges by proposing (1) a novel data-informed mean-field (MF) approach to efficiently map the parameter space of network models; and (2) an organizing principle for studying parameter space that enables the extraction of biologically meaningful relations from this high-dimensional data. We illustrate these ideas using a large-scale network model of the Macaque primary visual cortex. Of the 10-20 model parameters, we identify 7 that are especially poorly constrained, and use the MF algorithm in (1) to discover the firing rate contours in this 7D parameter cube. Defining a “biologically plausible” region to consist of parameters that exhibit spontaneous Excitatory and Inhibitory firing rates compatible with experimental values, we find that this region is a slightly thickened codimension-1 submanifold. An implication of this finding is that while plausible regimes depend sensitively on parameters, they are also robust and flexible provided one compensates appropriately when parameters are varied. Our organizing principle for conceptualizing parameter dependence is to focus on certain 2D parameter planes that govern lateral inhibition: Intersecting these planes with the biologically plausible region leads to very simple geometric structures which, when suitably scaled, have a universal character independent of where the intersections are taken. In addition to elucidating the geometry of the plausible region, this invariance suggests useful approximate scaling relations. Our study offers, for the first time, a complete characterization of the set of all biologically plausible parameters for a detailed cortical model, which has been out of reach due to the high dimensionality of parameter space.
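    The parameter-plane scans described in this abstract can be sketched as follows, with a stand-in firing-rate function in place of the paper's data-informed MF algorithm. The parameter names, ranges, and target firing-rate window are illustrative assumptions only.

        import numpy as np

        def surrogate_rates(s_ei, s_ie):
            """Toy stand-in for the MF map from two lateral-inhibition couplings
            to spontaneous (E, I) firing rates in spikes/s (assumed form)."""
            e_rate = 20.0 * np.exp(-s_ei * s_ie)       # stronger inhibition loop -> lower E rate
            i_rate = 3.0 + 4.0 * e_rate * s_ie         # I population driven by E
            return e_rate, i_rate

        # Scan a 2D plane of the two coupling parameters
        s_ei = np.linspace(0.1, 3.0, 200)
        s_ie = np.linspace(0.1, 3.0, 200)
        EI, IE = np.meshgrid(s_ei, s_ie)
        e_rate, i_rate = surrogate_rates(EI, IE)

        # "Biologically plausible" window for spontaneous rates (assumed values)
        plausible = (e_rate > 2.0) & (e_rate < 6.0) & (i_rate > 8.0) & (i_rate < 20.0)
        print(f"plausible fraction of the plane: {plausible.mean():.3f}")
        # In the paper's setting the analogous set is a thin, roughly
        # codimension-1 slab rather than a thick region of the cube.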
  6. Zhou, Dongzhuo Douglas (Ed.)
    This paper uses mathematical modeling to study the mechanisms of surround suppression in the primate visual cortex. We present a large-scale neural circuit model consisting of three interconnected components: LGN and two input layers (Layer 4Cα and Layer 6) of the primary visual cortex V1, covering several hundred hypercolumns. Anatomical structures are incorporated and physiological parameters from realistic modeling work are used. The remaining parameters are chosen to produce model outputs that emulate experimentally observed size-tuning curves. Our two main results are: (i) we discovered the character of the long-range connections in Layer 6 responsible for surround effects in the input layers; and (ii) we showed that a net-inhibitory feedback, i.e., feedback that excites I-cells more than E-cells, from Layer 6 to Layer 4 is conducive to producing surround properties consistent with experimental data. These results are obtained through parameter selection and model analysis. The effects of nonlinear recurrent excitation and inhibition are also discussed. A feature that distinguishes our model from previous modeling work on surround suppression is that we have tried to reproduce realistic lengthscales that are crucial for quantitative comparison with data. Due to its size and the large number of unknown parameters, the model is computationally challenging. We demonstrate a strategy that involves first locating baseline values for relevant parameters using a linear model, followed by the introduction of nonlinearities where needed. We find such a methodology effective, and propose it as a possibility in the modeling of complex biological systems.
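    The two-step strategy mentioned at the end of this abstract (a linear model for baseline parameters, then nonlinearities where needed) can be sketched on a toy three-population rate model. The connectivity, external drive, and nonlinearity below are illustrative assumptions, not the paper's fitted values.

        import numpy as np

        # Toy reduction to three populations: L4 excitatory, L4 inhibitory, L6 excitatory
        W = np.array([[0.8, -1.2, 0.4],
                      [1.0, -0.9, 0.9],
                      [0.6, -0.3, 0.2]])
        # Third column: L6 feedback excites I (0.9) more than E (0.4), i.e. net-inhibitory.
        ext = np.array([10.0, 8.0, 6.0])   # external / LGN drive (assumed)

        # Step 1: linear model, closed-form baseline rates r = (I - W)^{-1} ext
        r_linear = np.linalg.solve(np.eye(3) - W, ext)

        # Step 2: reintroduce a threshold-power-law nonlinearity and relax to a
        # self-consistent fixed point, starting from the linear baseline
        def phi(x):
            return np.maximum(x, 0.0) ** 1.5 / 10.0

        r = r_linear.copy()
        for _ in range(500):
            r = 0.9 * r + 0.1 * phi(W @ r + ext)

        print("linear baseline rates: ", np.round(r_linear, 2))
        print("nonlinear fixed point: ", np.round(r, 2))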
  7. This paper offers a theory for the origin of direction selectivity (DS) in the macaque primary visual cortex, V1. DS is essential for the perception of motion and control of pursuit eye movements. In the macaque visual pathway, neurons with DS first appear in V1, in the Simple cell population of the Magnocellular input layer 4Cα. The lateral geniculate nucleus (LGN) cells that project to these cortical neurons, however, are not direction selective. We hypothesize that DS is initiated in feed-forward LGN input, in the summed responses of LGN cells afferent to a cortical cell, and it is achieved through the interplay of 1) different visual response dynamics of ON and OFF LGN cells and 2) the wiring of ON and OFF LGN neurons to cortex. We identify specific temporal differences in the ON/OFF pathways that, together with item 2, produce distinct response time courses in separated subregions; analysis and simulations confirm the efficacy of the mechanisms proposed. To constrain the theory, we present data on Simple cells in layer 4Cα in response to drifting gratings. About half of the cells were found to have high DS, and the DS was broadband in spatial and temporal frequency (SF and TF). The proposed theory includes a complete analysis of how stimulus features such as SF and TF interact with ON/OFF dynamics and LGN-to-cortex wiring to determine the preferred direction and magnitude of DS.

     
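    A minimal sketch of the feed-forward mechanism proposed here: two spatially offset subregions, one fed by ON and one by OFF LGN cells with different temporal kernels, summed linearly and probed with gratings drifting in opposite directions. Kernel shapes, the subregion offset, and the grating parameters are assumptions chosen for illustration, not the paper's measured values.

        import numpy as np

        dt = 1e-3
        t = np.arange(0.0, 0.5, dt)                  # 0.5 s of response
        tau = np.arange(0.0, 0.2, dt)                # kernel support

        def temporal_kernel(tau, t_peak):
            """Biphasic impulse response peaking at t_peak (assumed shape)."""
            k = (tau / t_peak) * np.exp(1.0 - tau / t_peak)
            delayed = np.zeros_like(k)
            shift = int(0.04 / dt)
            delayed[shift:] = k[:-shift]
            return k - 0.6 * delayed                 # positive lobe, then a delayed negative lobe

        k_on = temporal_kernel(tau, 0.045)           # ON pathway: slower peak (assumed)
        k_off = temporal_kernel(tau, 0.030)          # OFF pathway: faster peak (assumed)

        sf, tf = 2.0, 8.0                            # grating: cycles/deg and Hz (assumed)
        x_on, x_off = 0.0, 0.125                     # subregion centres, degrees apart

        def summed_f1(direction):
            """F1 amplitude of the ON+OFF sum for a grating drifting in +/- direction."""
            total = np.zeros_like(t)
            # The -1 sign models the OFF subregion's inverted contrast preference.
            for x0, k, sign in [(x_on, k_on, +1.0), (x_off, k_off, -1.0)]:
                stim = np.cos(2 * np.pi * (sf * x0 - direction * tf * (t[:, None] - tau[None, :])))
                total += sign * (stim @ k) * dt      # convolve the sampled grating with the kernel
            return np.abs(np.exp(-2j * np.pi * tf * t) @ total) * 2 / len(t)

        r_plus, r_minus = summed_f1(+1.0), summed_f1(-1.0)
        print(f"DS index between the two drift directions = "
              f"{abs(r_plus - r_minus) / (r_plus + r_minus):.2f}")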
  8. Abstract

    In neuroscience, computational modeling is an effective way to gain insight into cortical mechanisms, yet the construction and analysis of large-scale network models—not to mention the extraction of underlying principles—are themselves challenging tasks, due to the absence of suitable analytical tools and the prohibitive costs of systematic numerical exploration of high-dimensional parameter spaces. In this paper, we propose a data-driven approach assisted by deep neural networks (DNN). The idea is to first discover certain input-output relations, and then to leverage this information and the superior computation speeds of the well-trained DNN to guide parameter searches and to deduce theoretical understanding. To illustrate this novel approach, we used as a test case a medium-size network of integrate-and-fire neurons intended to model local cortical circuits. With the help of an accurate yet extremely efficient DNN surrogate, we revealed the statistics of model responses, providing a detailed picture of model behavior. The information obtained is both general and of a fundamental nature, with direct application to neuroscience. Our results suggest that the methodology proposed can be scaled up to larger and more complex biological networks when used in conjunction with other techniques of biological modeling.

     
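    A minimal sketch of the surrogate idea in this abstract, not the paper's network or training setup: a toy "expensive simulator" maps four synaptic parameters to an (E, I) firing-rate pair, a small one-hidden-layer network is fit to a few hundred samples of that map, and the cheap surrogate is then used to screen a large batch of candidate parameter points. All names, ranges, and the screening window are illustrative assumptions.

        import numpy as np

        rng = np.random.default_rng(3)

        def expensive_simulator(p):
            """Stand-in for a slow spiking-network simulation (vectorized over rows of p)."""
            w_ee, w_ei, w_ie, w_ii = p.T
            e = 5.0 * np.exp(0.5 * (w_ee - w_ei)) / (1.0 + w_ie)
            i = 2.0 + 1.5 * e * w_ie / (1.0 + w_ii)
            return np.stack([e, i], axis=1)

        # Training data: a few hundred "simulations", standardized for training
        P = rng.uniform(0.2, 2.0, size=(400, 4))
        R = expensive_simulator(P)
        p_mu, p_sd = P.mean(0), P.std(0)
        r_mu, r_sd = R.mean(0), R.std(0)
        Pn, Rn = (P - p_mu) / p_sd, (R - r_mu) / r_sd

        # One-hidden-layer surrogate trained by full-batch gradient descent
        W1 = 0.5 * rng.standard_normal((4, 32)); b1 = np.zeros(32)
        W2 = 0.5 * rng.standard_normal((32, 2)); b2 = np.zeros(2)
        lr = 0.02
        for _ in range(10_000):
            H = np.tanh(Pn @ W1 + b1)
            err = H @ W2 + b2 - Rn                        # gradient of the squared error
            gW2, gb2 = H.T @ err / len(Pn), err.mean(0)
            dH = (err @ W2.T) * (1.0 - H ** 2)
            gW1, gb1 = Pn.T @ dH / len(Pn), dH.mean(0)
            W1 -= lr * gW1; b1 -= lr * gb1; W2 -= lr * gW2; b2 -= lr * gb2

        def surrogate(p):
            h = np.tanh((p - p_mu) / p_sd @ W1 + b1)
            return (h @ W2 + b2) * r_sd + r_mu

        # Screen many candidate parameter sets at negligible cost
        cand = rng.uniform(0.2, 2.0, size=(100_000, 4))
        pred = surrogate(cand)
        keep = (pred[:, 0] > 2.0) & (pred[:, 0] < 6.0) & (pred[:, 1] > 4.0) & (pred[:, 1] < 10.0)
        print(f"{keep.sum()} of {len(cand)} candidates pass the firing-rate screen")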